nonlocal block
Reviews: Nonlocal Neural Networks, Nonlocal Diffusion and Nonlocal Modeling
This paper follows up on the nonlocal neural network and studies its diffusion and damping behavior through spectrum analysis. For image classification on CIFAR-10, the authors train networks that incorporate nonlocal blocks into a 20-layer PreResNet and find that most eigenvalues of the trained weight matrices are negative, and that adding more blocks leads to convergence difficulties under certain learning rates and training epochs. A steady-state analysis of the nonlocal operator shows that, by design, the output signals of the original nonlocal blocks tend to be damped out (diffused) over iterations. To overcome this damping, the authors propose a new nonlocal network built around a "nonlocal stage" component, which changes the residual term: instead of a weighted sum of the neighboring features, it uses the difference between the neighboring signals and the computed signal, giving the block a nonlocal-diffusion form. A second change replaces the pairwise affinity function's dependence on the updated output with a dependence on the input feature, which stays fixed throughout propagation within a stage.
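The two residual forms described above can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's implementation: the feature matrix is random, `W` is a stand-in for the learned 1x1-convolution weight, and the affinity is a simplified embedded-Gaussian softmax.

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 8, 4                        # positions x channels (toy sizes)
x = rng.normal(size=(n, c))
W = 0.1 * np.eye(c)                # stand-in for the learned 1x1-conv weight

def affinity(f):
    # Embedded-Gaussian-style pairwise affinity, row-normalized via softmax.
    s = f @ f.T
    e = np.exp(s - s.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Original nonlocal block: the residual is a weighted sum of the
# neighboring features, z = x + (A x) W.
z_block = x + (affinity(x) @ x) @ W

# Nonlocal stage: the residual is the difference between the neighboring
# signals and the computed signal, z <- z + (A z - z) W, i.e. a
# nonlocal-diffusion term; the affinity A is computed once from the input
# feature and kept fixed throughout the stage.
A = affinity(x)
z_stage = x.copy()
for _ in range(10):
    z_stage = z_stage + (A @ z_stage - z_stage) @ W
```

Because `A` is row-stochastic, a signal that is constant across positions makes the diffusion residual `A z - z` vanish, so such signals are steady states of the stage update rather than being damped toward zero.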
Unifying Nonlocal Blocks for Neural Networks
Zhu, Lei, She, Qi, Li, Duo, Lu, Yanye, Kang, Xuejing, Hu, Jie, Wang, Changhu
The nonlocal-based blocks are designed for capturing long-range spatial-temporal dependencies in computer vision tasks. Although having shown excellent performance, they still lack the mechanism to encode the rich, structured information among elements in an image or video. In this paper, to theoretically analyze the property of these nonlocal-based blocks, we provide a new perspective to interpret them, viewing them as a set of graph filters generated on a fully-connected graph. Specifically, when choosing the Chebyshev graph filter, a unified formulation can be derived for explaining and analyzing the existing nonlocal-based blocks (e.g., nonlocal block, nonlocal stage, double attention block). Furthermore, by considering the spectral properties, we propose an efficient and robust spectral nonlocal block, which can capture long-range dependencies more robustly and flexibly than the existing nonlocal blocks when inserted into deep neural networks. Experimental results demonstrate the clear-cut improvements and practical applicability of our method on image classification, action recognition, semantic segmentation, and person re-identification tasks.
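The graph-filter view in this abstract can be illustrated with a minimal sketch: treat the pairwise affinity matrix as the adjacency of a fully-connected graph and run a Chebyshev-style polynomial filter on its Laplacian. This is a simplified illustration under assumed choices (random features, a random-walk Laplacian, and the recurrence applied to `L` directly rather than a rescaled Laplacian), not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, c = 8, 4
x = rng.normal(size=(n, c))

# Fully-connected graph: the pairwise affinity matrix acts as the adjacency.
A = np.exp(x @ x.T)
A = A / A.sum(axis=1, keepdims=True)   # row-normalized affinity
L = np.eye(n) - A                       # random-walk graph Laplacian

def chebyshev_filter(L, x, thetas):
    # Polynomial graph filter sum_k theta_k * T_k(L) @ x with the Chebyshev
    # recurrence T_0 = I, T_1 = L, T_k = 2 L T_{k-1} - T_{k-2}.
    t_prev, t_curr = x, L @ x
    out = thetas[0] * t_prev
    if len(thetas) > 1:
        out = out + thetas[1] * t_curr
    for theta in thetas[2:]:
        t_prev, t_curr = t_curr, 2 * (L @ t_curr) - t_prev
        out = out + theta * t_curr
    return out

# With coefficients [2, -1] the order-1 filter gives 2x - (x - A x) = x + A x,
# i.e. the residual form of the basic nonlocal block (with identity weights).
y = chebyshev_filter(L, x, [2.0, -1.0])
```

The point of the unified view is that different coefficient choices in the same filter family recover different existing blocks; the `[2, -1]` choice above is just the simplest instance.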
Nonlocal Neural Networks, Nonlocal Diffusion and Nonlocal Modeling
Tao, Yunzhe, Sun, Qi, Du, Qiang, Liu, Wei
Nonlocal neural networks have been proposed and shown to be effective in several computer vision tasks, where the nonlocal operations can directly capture long-range dependencies in the feature space. In this paper, we study the nature of the diffusion and damping effects of nonlocal networks by performing spectrum analysis on the weight matrices of the well-trained networks, and then propose a new formulation of the nonlocal block. The new block not only learns the nonlocal interactions but also has stable dynamics, thus allowing deeper nonlocal structures. Moreover, we interpret our formulation from the general nonlocal modeling perspective, making connections between the proposed nonlocal network and other nonlocal models, such as nonlocal diffusion processes and Markov jump processes.
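The spectrum diagnostic the abstract refers to can be sketched as follows. The weight matrix here is a random stand-in for the trained transformation weights of a nonlocal block (the paper examines networks trained on CIFAR-10), so only the procedure, not the numbers, carries over.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a trained nonlocal-block weight matrix; in the paper this
# would come from the 1x1 convolution of a well-trained block.
W = rng.normal(size=(16, 16)) / np.sqrt(16)

# Spectrum analysis: examine the eigenvalues of the weight matrix.
eigvals = np.linalg.eigvals(W)
n_negative = int(np.sum(eigvals.real < 0))
frac_negative = n_negative / len(eigvals)

# The paper's reported observation is that for well-trained blocks most
# eigenvalues are negative, which makes the block update act as damping.
```

Running this on actual trained weights (rather than the random stand-in) is what lets one check whether the damping regime described in the paper applies.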